Wavelet-generalized least squares: a new BLU estimator of linear regression models with 1/f errors.
Authors
Abstract
Long-memory noise is common to many areas of signal processing and can seriously confound estimation of linear regression model parameters and their standard errors. Classical autoregressive moving average (ARMA) methods can adequately address the problem of linear, time-invariant, short-memory errors but may be inefficient and/or insufficient to secure type 1 error control in the context of fractal or scale-invariant noise with a more slowly decaying autocorrelation function. Here we introduce a novel method, called wavelet-generalized least squares (WLS), which is (to a good approximation) the best linear unbiased (BLU) estimator of regression model parameters in the context of long-memory errors. The method also provides maximum likelihood (ML) estimates of the Hurst exponent (which can be readily translated to the fractal dimension or spectral exponent) characterizing the correlational structure of the errors, and the error variance. The algorithm exploits the whitening or Karhunen-Loève-type property of the discrete wavelet transform to diagonalize the covariance matrix of the errors generated by an iterative fitting procedure after both data and design matrix have been transformed to the wavelet domain. Properties of this estimator, including its Cramér-Rao bounds, are derived theoretically and compared to its empirical performance on a range of simulated data. Compared to ordinary least squares and ARMA-based estimators, WLS is shown to be more efficient and to give excellent type 1 error control. The method is also applied to some real (neurophysiological) data acquired by functional magnetic resonance imaging (fMRI) of the human brain. We conclude that wavelet-generalized least squares may be a generally useful estimator of regression models in data complicated by long-memory or fractal noise.
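As a rough illustration of the procedure described in the abstract, the Python sketch below transforms both the data vector and the columns of the design matrix to the wavelet domain, treats the error covariance as diagonal there with a level-dependent variance, and searches the Hurst exponent by maximum likelihood, refitting the regression weights at each candidate value. The library (PyWavelets), the 'db4' wavelet, the decomposition depth, the level-wise variance model proportional to 2^(j(2H-1)), and the grid search are assumptions made for this sketch, not the authors' exact algorithm.

import numpy as np
import pywt

def _detail_levels(x, wavelet, level):
    # pywt.wavedec returns [cA_L, cD_L, ..., cD_1]; keep only the detail
    # bands, ordered from coarsest (scale L) to finest (scale 1).
    return pywt.wavedec(np.asarray(x, float), wavelet, level=level)[1:]

def wavelet_gls(y, X, wavelet="db4", level=4,
                H_grid=np.linspace(0.05, 0.95, 91)):
    """Approximate BLU estimate of beta under 1/f errors via wavelet-domain GLS."""
    X = np.asarray(X, float)
    Wy = _detail_levels(y, wavelet, level)
    WX = [np.column_stack([_detail_levels(X[:, p], wavelet, level)[j]
                           for p in range(X.shape[1])])
          for j in range(level)]
    scales = np.arange(level, 0, -1)            # octave index of each detail band
    best = {"loglik": -np.inf}
    for H in H_grid:
        # Assumed variance model for wavelet coefficients of long-memory noise:
        # Var(w_{j,k}) proportional to 2**(j * (2H - 1)), larger at coarse scales
        # when H > 0.5.
        w = 2.0 ** (scales * (2.0 * H - 1.0))
        # Diagonal-covariance (weighted) least squares pooled across bands.
        A = np.vstack([WX[j] / np.sqrt(w[j]) for j in range(level)])
        b = np.concatenate([Wy[j] / np.sqrt(w[j]) for j in range(level)])
        beta, *_ = np.linalg.lstsq(A, b, rcond=None)
        resid = b - A @ beta
        n = b.size
        sigma2 = resid @ resid / n              # ML estimate of the error variance
        loglik = -0.5 * (n * np.log(sigma2)
                         + sum(len(Wy[j]) * np.log(w[j]) for j in range(level))
                         + n)
        if loglik > best["loglik"]:
            best = {"H": H, "sigma2": sigma2, "beta": beta, "loglik": loglik}
    return best

For instance, with y a time series of length n and X an (n x p) design matrix, wavelet_gls(y, X) would return the fitted regression coefficients together with the ML estimates of the Hurst exponent and error variance under these assumptions; the function name and defaults are illustrative only.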
Similar resources
Heteroskedastic linear regression: steps towards adaptivity, efficiency, and robustness
In linear regression with heteroscedastic errors, the Generalized Least Squares (GLS) estimator is optimal, i.e., it is the Best Linear Unbiased Estimator (BLUE). The Ordinary Least Squares (OLS) estimator is suboptimal but still valid, i.e., unbiased and consistent. Halbert White, in his seminal paper (Econometrica, 1980), used the OLS residuals in order to obtain an estimate of the standard er...
A New Stochastic Restricted Biased Estimator under Heteroscedastic or Correlated Error
In this paper, under the linear regression model with heteroscedastic and/or correlated errors, when stochastic linear restrictions on the parameter vector are assumed to hold, a generalization of the ordinary mixed estimator (GOME), the ordinary ridge regression estimator (GORR), and the generalized least squares estimator (GLSE) is proposed. The performance of this new estimator against GOME, GO...
Time-Series Regression and Generalized Least Squares in R: An Appendix to An R Companion to Applied Regression, Second Edition
Generalized least-squares (GLS) regression extends ordinary least-squares (OLS) estimation of the normal linear model by providing for possibly unequal error variances and for correlations between different errors. A common application of GLS estimation is to time-series regression, in which it is generally implausible to assume that errors are independent. This appendix to Fox and Weisberg (2...
Conditionally Unbiased Bounded Influence Estimation in General Regression
In this paper we study robust estimation in general models for the dependence of a response y on an explanatory vector z. First, we extend previous work on bounded influence estimators in linear regression. Second, we construct optimal bounded influence estimators for generalized linear models. We consider the class of estimators defined by an estimating equation with a conditionally unbiased score flw...
Toward optimal model averaging in regression models with time series errors
Consider a regression model with infinitely many parameters and time series errors. We are interested in choosing weights for averaging across generalized least squares (GLS) estimators obtained from a set of approximating models. However, GLS estimators, depending on the unknown inverse covariance matrix of the errors, are usually infeasible. We therefore construct feasible generalized least s...
Journal: NeuroImage
Volume: 15, Issue: 1
Pages: -
Publication date: 2002